Set up: load these packages!

library(umap)
library(caret)
library(DT)
library(tidyverse)
library(e1071)
library(gridExtra)
library(mda)
library(randomForest)
library(neuralnet)

MakML: Introduction to Machine Learning

Below is the code for the following modules:

  1. Introduction to R
  2. Biomarker development
  3. Dimension Reduction
  4. Visualizing High Dimensional Data in R
  5. Support Vector Machines
  6. Decision Trees and Random Forests
  7. Neural Networks
  8. Additional Practice

Introduction to R

To get started, you will need the following dependencies (R packages to install and datasets to download):

  1. Install R and RStudio
  2. Please install the following R packages for data management: tidyverse, DT, and gridExtra. In R:
install.packages(c("tidyverse", "DT", "gridExtra"))
  3. Please install the following R packages for machine learning: umap, mda, caret, e1071, rpart, randomForest, neuralnet. In R:
install.packages(c("umap", "mda", "caret", "e1071", "rpart", "randomForest", "neuralnet"))
  4. You will need the following datasets: heights.rds, olive.rds, and TBnanostring.rds
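If you want to avoid reinstalling packages that are already present, here is a small helper sketch (the package list is the one given above):

```r
## Install only the packages from the list above that are missing
pkgs <- c("tidyverse", "DT", "gridExtra", "umap", "mda", "caret",
          "e1071", "rpart", "randomForest", "neuralnet")
missing <- pkgs[!pkgs %in% rownames(installed.packages())]
if (length(missing) > 0) install.packages(missing)
```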

Data Frames

A large proportion of data analysis challenges start with data stored in a data frame. For example, we stored the data for our motivating example in a data frame. You can access this dataset by loading the TBnanostring.rds object in R:

TBnanostring <- readRDS("TBnanostring.rds")

In RStudio we can view the data with the View function:

View(TBnanostring)

Or in RMarkdown you can use the datatable function from the DT package:

datatable(TBnanostring)

You will notice that the TB status is found in the first column of the data frame, followed by the genes in the subsequent columns. The rows represent each individual patient.
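A few quick checks confirm that structure (a sketch, assuming the file loads as above; the status column is named TB_Status, as used later in this tutorial):

```r
## Inspect the structure of the data frame
dim(TBnanostring)                  # patients (rows) by TB status + genes (columns)
table(TBnanostring$TB_Status)      # number of patients in each TB status group
head(colnames(TBnanostring))       # TB_Status first, then gene names
```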

Dimension Reduction

PCA Example

Here is the code for applying PCA to the Nanostring dataset:

pca_out <- prcomp(TBnanostring[,-1])
names(pca_out)
## [1] "sdev"     "rotation" "center"   "scale"    "x"

Here is a summary of the explained variation from the PCA:

round(pca_out$sdev^2/sum(pca_out$sdev^2),3)
##   [1] 0.468 0.073 0.058 0.037 0.033 0.024 0.021 0.018 0.016 0.015 0.013 0.013
##  [13] 0.011 0.010 0.009 0.009 0.008 0.008 0.008 0.007 0.007 0.007 0.006 0.006
##  [25] 0.006 0.006 0.005 0.005 0.005 0.004 0.004 0.004 0.004 0.003 0.003 0.003
##  [37] 0.003 0.003 0.003 0.003 0.002 0.002 0.002 0.002 0.002 0.002 0.002 0.002
##  [49] 0.002 0.002 0.002 0.002 0.001 0.001 0.001 0.001 0.001 0.001 0.001 0.001
##  [61] 0.001 0.001 0.001 0.001 0.001 0.001 0.001 0.001 0.001 0.001 0.001 0.001
##  [73] 0.001 0.001 0.001 0.001 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
##  [85] 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000
##  [97] 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000

And the cumulative variation explained:

round(cumsum(pca_out$sdev^2)/sum(pca_out$sdev^2),3)
##   [1] 0.468 0.541 0.599 0.636 0.669 0.694 0.714 0.732 0.747 0.762 0.776 0.788
##  [13] 0.799 0.810 0.819 0.827 0.836 0.844 0.851 0.859 0.866 0.872 0.879 0.885
##  [25] 0.891 0.896 0.902 0.906 0.911 0.915 0.919 0.923 0.927 0.931 0.934 0.937
##  [37] 0.940 0.943 0.946 0.949 0.951 0.953 0.956 0.958 0.960 0.962 0.964 0.966
##  [49] 0.967 0.969 0.971 0.972 0.974 0.975 0.976 0.978 0.979 0.980 0.981 0.982
##  [61] 0.983 0.984 0.985 0.986 0.987 0.988 0.988 0.989 0.990 0.990 0.991 0.992
##  [73] 0.992 0.993 0.993 0.994 0.994 0.995 0.995 0.995 0.996 0.996 0.997 0.997
##  [85] 0.997 0.997 0.998 0.998 0.998 0.998 0.998 0.999 0.999 0.999 0.999 0.999
##  [97] 0.999 0.999 1.000 1.000 1.000 1.000 1.000 1.000 1.000 1.000 1.000
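A natural follow-up question is how many components to keep. Here is a short sketch using the quantities above (the 90% threshold is just an illustrative choice):

```r
## Proportion of variance explained by each PC, and its cumulative sum
var_explained <- pca_out$sdev^2 / sum(pca_out$sdev^2)
cum_var <- cumsum(var_explained)

## First PC at which at least 90% of the variance is explained
## (from the cumulative values printed above, this is PC 27)
which(cum_var >= 0.90)[1]

## A scree plot of the leading components
plot(var_explained[1:20], type = "b",
     xlab = "Principal component", ylab = "Proportion of variance explained")
```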

Now we will make a dataframe with the PCs for later use!

pca_reduction <- as.data.frame(pca_out$x)
pca_reduction$Condition <- as.factor(TBnanostring$TB_Status)
datatable(pca_reduction)

UMAP Example

set.seed(0) ## need to set the seed or results might be different
umap_out <- umap(TBnanostring[,-1])
names(umap_out)
## [1] "layout" "data"   "knn"    "config"

Now we will make a dataframe with the UMAP results for later use!

umap_reduction <- as.data.frame(umap_out$layout)
umap_reduction$Class <- as.factor(TBnanostring$TB_Status)
datatable(umap_reduction)
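The umap function is controlled by a configuration object; if you want to experiment with its settings, here is a sketch (n_neighbors = 30 is an arbitrary illustrative value):

```r
## Inspect the default UMAP settings
umap.defaults              # n_neighbors, min_dist, n_components, ...

## Re-run UMAP with a custom configuration
set.seed(0)
custom <- umap.defaults
custom$n_neighbors <- 30   # larger values emphasize global structure
umap_out_custom <- umap(TBnanostring[,-1], config = custom)
```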

Visualizing data using ggplot2

Below is a step-by-step tutorial for making PCA and UMAP plots using ggplot2.

PCA Example

Step 0

We want to make the following plot using ggplot2.

Step 1

## Read in the data
TBnanostring <- readRDS("TBnanostring.rds")

Step 2

## Read in the data
TBnanostring <- readRDS("TBnanostring.rds")

## Apply PCA
pca_out <- prcomp(TBnanostring[,-1])

Step 3

## Read in the data
TBnanostring <- readRDS("TBnanostring.rds")

## Apply PCA
pca_out <- prcomp(TBnanostring[,-1])

## Make a dataframe with the results for plotting
pca_reduction <- as.data.frame(pca_out$x)
pca_reduction$Condition <- as.factor(TBnanostring$TB_Status)

Step 4

## Read in the data
TBnanostring <- readRDS("TBnanostring.rds")

## Apply PCA
pca_out <- prcomp(TBnanostring[,-1])

## Make a dataframe with the results for plotting
pca_reduction <- as.data.frame(pca_out$x)
pca_reduction$Condition <- as.factor(TBnanostring$TB_Status)

## Initialize the plot
pca_reduction %>% ggplot()

Step 5

## Read in the data
TBnanostring <- readRDS("TBnanostring.rds")

## Apply PCA
pca_out <- prcomp(TBnanostring[,-1])

## Make a dataframe with the results for plotting
pca_reduction <- as.data.frame(pca_out$x)
pca_reduction$Condition <- as.factor(TBnanostring$TB_Status)

## Add your geometry layer with x and y aesthetics
pca_reduction %>% ggplot() + 
  geom_point(aes(x=PC1, y=PC2)) 

Step 6

## Read in the data
TBnanostring <- readRDS("TBnanostring.rds")

## Apply PCA
pca_out <- prcomp(TBnanostring[,-1])

## Make a dataframe with the results for plotting
pca_reduction <- as.data.frame(pca_out$x)
pca_reduction$Condition <- as.factor(TBnanostring$TB_Status)

## Change the shape of the points
pca_reduction %>% ggplot() + 
  geom_point(aes(x=PC1, y=PC2), shape=1)  

Step 7

## Read in the data
TBnanostring <- readRDS("TBnanostring.rds")

## Apply PCA
pca_out <- prcomp(TBnanostring[,-1])

## Make a dataframe with the results for plotting
pca_reduction <- as.data.frame(pca_out$x)
pca_reduction$Condition <- as.factor(TBnanostring$TB_Status)

## Change color of points (add a mapping aesthetic) 
pca_reduction %>% ggplot() + 
  geom_point(aes(x=PC1, y=PC2, color=Condition), shape=1)

Step 8

## Read in the data
TBnanostring <- readRDS("TBnanostring.rds")

## Apply PCA
pca_out <- prcomp(TBnanostring[,-1])

## Make a dataframe with the results for plotting
pca_reduction <- as.data.frame(pca_out$x)
pca_reduction$Condition <- as.factor(TBnanostring$TB_Status)

## Add labels, title, and theme
pca_reduction %>% ggplot() + 
  geom_point(aes(x=PC1, y=PC2, color=Condition), shape=1) + 
  xlab("PC 1") + ylab("PC 2") + ggtitle("PCA Plot") +
  theme(plot.title = element_text(hjust = 0.5))  

UMAP Example

Here is the final UMAP plot

## Read in data
TBnanostring <- readRDS("TBnanostring.rds")

## Apply UMAP reduction
set.seed(0)
library(umap)
umap_out <- umap(TBnanostring[,-1])

## Make dataframe for plotting in tidy format
umap_reduction <- as.data.frame(umap_out$layout)
umap_reduction$Condition <- as.factor(TBnanostring$TB_Status)

## Plot results with ggplot2
umap_reduction %>% ggplot() + 
  geom_point(aes(x=V1, y=V2, color=Condition), shape=1) + 
  xlab("UMAP 1") + ylab("UMAP 2") + ggtitle("UMAP Plot") +
  theme(plot.title = element_text(hjust = 0.5))  

Using the caret package

The caret package in R has several useful functions for building and assessing machine learning methods. It tries to consolidate many machine learning tools to provide a consistent syntax.

Using the height data

For a first example, we use the height data in dslabs:

heights <- readRDS("heights.rds")
datatable(heights)
boxplot(heights$height ~ heights$sex)

Partition dataset

The caret package includes the function createDataPartition, which helps us generate indexes for randomly splitting the data into training and test sets:

set.seed(2007)
test_index <- createDataPartition(heights$sex, times = 1, 
                                  p = 0.5, list = FALSE)
test_set <- heights[test_index, ]
train_set <- heights[-test_index, ]

The argument times is used to define how many random samples of indexes to return, p is used to define what proportion of the data is represented by the index, and list is used to decide if we want the indexes returned as a list or not.
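Because createDataPartition samples within each class, both halves keep roughly the same sex proportions; a quick check:

```r
## Class proportions should be similar in the full data, training set, and test set
prop.table(table(heights$sex))
prop.table(table(train_set$sex))
prop.table(table(test_set$sex))
```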

Simple predictor

Can we use height to predict sex? Exploratory data analysis suggests we can because, on average, males are slightly taller than females:

heights %>% group_by(sex) %>% 
  summarize(mean(height), sd(height))
## # A tibble: 2 × 3
##   sex    `mean(height)` `sd(height)`
##   <fct>           <dbl>        <dbl>
## 1 Female           64.9         3.76
## 2 Male             69.3         3.61

Let’s try predicting with a simple approach: predict Male if the height is above 62 inches, which is roughly two standard deviations below the average male height. The overall accuracy is 0.78 in the test set:

pred_test <- ifelse(test_set$height > 62, "Male", "Female") %>% 
  factor(levels = levels(test_set$sex))
confusionMatrix(pred_test, test_set$sex)
## Confusion Matrix and Statistics
## 
##           Reference
## Prediction Female Male
##     Female     22   18
##     Male       97  388
##                                           
##                Accuracy : 0.781           
##                  95% CI : (0.7431, 0.8156)
##     No Information Rate : 0.7733          
##     P-Value [Acc > NIR] : 0.3607          
##                                           
##                   Kappa : 0.1836          
##                                           
##  Mcnemar's Test P-Value : 3.502e-13       
##                                           
##             Sensitivity : 0.18487         
##             Specificity : 0.95567         
##          Pos Pred Value : 0.55000         
##          Neg Pred Value : 0.80000         
##              Prevalence : 0.22667         
##          Detection Rate : 0.04190         
##    Detection Prevalence : 0.07619         
##       Balanced Accuracy : 0.57027         
##                                           
##        'Positive' Class : Female          
## 
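To see where the 62-inch cutoff comes from, note that it is roughly the male average minus two standard deviations; we can also search over candidate cutoffs on the training set (a sketch; the 61-70 range is an arbitrary choice):

```r
## The cutoff is about mean - 2*sd of male heights (~69.3 - 2*3.61)
train_set %>% filter(sex == "Male") %>%
  summarize(cutoff = mean(height) - 2 * sd(height))

## Training-set accuracy for a range of candidate cutoffs
cutoffs <- seq(61, 70)
accuracy <- sapply(cutoffs, function(cut) {
  pred <- ifelse(train_set$height > cut, "Male", "Female")
  mean(pred == train_set$sex)
})
cutoffs[which.max(accuracy)]   # best-performing cutoff on the training set
```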

Use the train function

The caret package currently includes 237+ different machine learning methods, which can be applied using the train function. These are summarized in the caret package manual.

Keep in mind that caret does not bundle the packages each method depends on; to run a method through caret, you still need to install the underlying package yourself. For example, here we fit a logistic regression with method = "glm":

height_glm <- train(sex ~ height, method = "glm", data=train_set)
confusionMatrix(predict(height_glm, train_set), 
                train_set$sex)
## Confusion Matrix and Statistics
## 
##           Reference
## Prediction Female Male
##     Female     61   18
##     Male       58  388
##                                           
##                Accuracy : 0.8552          
##                  95% CI : (0.8222, 0.8842)
##     No Information Rate : 0.7733          
##     P-Value [Acc > NIR] : 1.679e-06       
##                                           
##                   Kappa : 0.5314          
##                                           
##  Mcnemar's Test P-Value : 7.691e-06       
##                                           
##             Sensitivity : 0.5126          
##             Specificity : 0.9557          
##          Pos Pred Value : 0.7722          
##          Neg Pred Value : 0.8700          
##              Prevalence : 0.2267          
##          Detection Rate : 0.1162          
##    Detection Prevalence : 0.1505          
##       Balanced Accuracy : 0.7341          
##                                           
##        'Positive' Class : Female          
## 
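By default, train() estimates performance with bootstrap resampling. Here is a sketch of switching to 10-fold cross-validation via trainControl:

```r
## 10-fold cross-validation instead of the default bootstrap
ctrl <- trainControl(method = "cv", number = 10)
height_glm_cv <- train(sex ~ height, method = "glm",
                       data = train_set, trControl = ctrl)
height_glm_cv$results   # cross-validated accuracy and kappa
```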

SVMs (Linear)

Generate dataset

First generate some data in 2 dimensions, and make them a little separated:

set.seed(10111)
x = matrix(rnorm(40), 20, 2)
y = rep(c(-1, 1), c(10, 10))
x[y == 1,] = x[y == 1,] + 1
plot(x, col = y + 3, pch = 19)

Apply SVM

We will use the e1071 package, which contains the svm function. It works on a data frame, and the response \(y\) needs to be a factor variable. Printing svmfit gives a summary of the fit.

dat = data.frame(x, y = as.factor(y))
svmfit = svm(y ~ ., data = dat, kernel = "linear", cost = 10, scale = FALSE)
print(svmfit)
## 
## Call:
## svm(formula = y ~ ., data = dat, kernel = "linear", cost = 10, scale = FALSE)
## 
## 
## Parameters:
##    SVM-Type:  C-classification 
##  SVM-Kernel:  linear 
##        cost:  10 
## 
## Number of Support Vectors:  6

You can see that the number of support vectors is 6: they are the points that are close to the boundary or on the wrong side of it.

Plot SVM (version 1)

There’s a generic plot function for SVM that shows the decision boundary, as you can see below. It doesn’t seem there’s much control over the colors. It breaks with convention since it puts x2 on the horizontal axis and x1 on the vertical axis.

plot(svmfit, dat)

Plot SVM (version 2)

Or plotting it more cleanly:

make.grid = function(x, n = 75) {
  grange = apply(x, 2, range)
  x1 = seq(from = grange[1,1], to = grange[2,1], length = n)
  x2 = seq(from = grange[1,2], to = grange[2,2], length = n)
  expand.grid(X1 = x1, X2 = x2)
}
xgrid = make.grid(x)
ygrid = predict(svmfit, xgrid)
plot(xgrid, col = c("red","blue")[as.numeric(ygrid)], pch = 20, cex = .2)
points(x, col = y + 3, pch = 19)
points(x[svmfit$index,], pch = 5, cex = 2)

Unfortunately, the svm function is not too friendly, in that you have to do some work to recover the linear coefficients. The reason is probably that this only makes sense for linear kernels, and the function is more general. So let’s extract the coefficients from the fitted object: \(\beta\) is a weighted sum of the support vectors, and \(\beta_0\) comes from svmfit$rho.

beta = drop(t(svmfit$coefs)%*%x[svmfit$index,])
beta0 = svmfit$rho

Now you can replot the points on the grid, then put the points back in (including the support vector points). Then you can use the coefficients to draw the decision boundary. Because beta0 above is svmfit$rho, the boundary satisfies an equation of the form:

\[x_1\beta_1+x_2\beta_2-\beta_0=0\]
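Rearranging the boundary equation for \(x_2\) makes the abline calls below transparent (a small derivation; note that beta0 was set to svmfit$rho, so the boundary is where \(x_1\beta_1 + x_2\beta_2 - \beta_0 = 0\)):

\[x_2 = \frac{\beta_0}{\beta_2} - \frac{\beta_1}{\beta_2}x_1\]

and the two margins, where \(x_1\beta_1 + x_2\beta_2 - \beta_0 = \pm 1\), are the parallel lines

\[x_2 = \frac{\beta_0 \mp 1}{\beta_2} - \frac{\beta_1}{\beta_2}x_1\]

which is exactly the intercept/slope form passed to abline.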

Now plotting the lines on the graph:

plot(xgrid, col = c("red", "blue")[as.numeric(ygrid)], pch = 20, cex = .2)
points(x, col = y + 3, pch = 19)
points(x[svmfit$index,], pch = 5, cex = 2)
abline(beta0 / beta[2], -beta[1] / beta[2])
abline((beta0 - 1) / beta[2], -beta[1] / beta[2], lty = 2)
abline((beta0 + 1) / beta[2], -beta[1] / beta[2], lty = 2)

SVM Nanostring data

Remember the PCA dimension reduction of the TB Nanostring dataset. The points are colored based on TB status.

Now let’s fit a linear SVM on the first two PCs of the Nanostring data:

# use only the first 2 PCs
dat = data.frame(y = pca_reduction$Condition, 
                 pca_reduction[,1:2])

fit = svm(y ~ ., data = dat, scale = FALSE, 
          kernel = "linear", cost = 10)
print(fit)
## 
## Call:
## svm(formula = y ~ ., data = dat, kernel = "linear", cost = 10, scale = FALSE)
## 
## 
## Parameters:
##    SVM-Type:  C-classification 
##  SVM-Kernel:  linear 
##        cost:  10 
## 
## Number of Support Vectors:  52

We can evaluate the predictor with a confusion matrix. (Note that confusionMatrix(data, reference) expects the predicted classes first; below the arguments are reversed, which leaves the accuracy unchanged but swaps the roles of sensitivity and specificity.)

confusionMatrix(dat$y,predict(fit,dat))
## Confusion Matrix and Statistics
## 
##           Reference
## Prediction TB LTBI
##       TB   69   10
##       LTBI  8   92
##                                           
##                Accuracy : 0.8994          
##                  95% CI : (0.8457, 0.9393)
##     No Information Rate : 0.5698          
##     P-Value [Acc > NIR] : <2e-16          
##                                           
##                   Kappa : 0.7955          
##                                           
##  Mcnemar's Test P-Value : 0.8137          
##                                           
##             Sensitivity : 0.8961          
##             Specificity : 0.9020          
##          Pos Pred Value : 0.8734          
##          Neg Pred Value : 0.9200          
##              Prevalence : 0.4302          
##          Detection Rate : 0.3855          
##    Detection Prevalence : 0.4413          
##       Balanced Accuracy : 0.8990          
##                                           
##        'Positive' Class : TB              
## 

Plotting Nanostring data:

plot(fit,dat,PC2~PC1)

And plotting the results more cleanly:
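The cleaner figure can be produced along these lines (a sketch, reusing make.grid from the simulated example above):

```r
## Color a grid of points by the SVM's prediction, then overlay the data
xgrid <- make.grid(as.matrix(dat[, c("PC1", "PC2")]))
colnames(xgrid) <- c("PC1", "PC2")   # predict() needs the training column names
ygrid <- predict(fit, xgrid)
plot(xgrid, col = c("red", "blue")[as.numeric(ygrid)], pch = 20, cex = .2)
points(dat[, c("PC1", "PC2")], col = c("red", "blue")[as.numeric(dat$y)], pch = 19)
points(dat[fit$index, c("PC1", "PC2")], pch = 5, cex = 2)   # support vectors
```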

SVMs (Non-linear)

Example 1 (polynomial kernel)

Apply polynomial SVM

Now let’s apply a non-linear (polynomial) SVM to our prior simulated dataset.

dat = data.frame(x, y = as.factor(y))
svmfit = svm(y ~ ., data = dat, 
             kernel = "polynomial", cost = 10, scale = FALSE)
print(svmfit)
## 
## Call:
## svm(formula = y ~ ., data = dat, kernel = "polynomial", cost = 10, 
##     scale = FALSE)
## 
## 
## Parameters:
##    SVM-Type:  C-classification 
##  SVM-Kernel:  polynomial 
##        cost:  10 
##      degree:  3 
##      coef.0:  0 
## 
## Number of Support Vectors:  4

Plot results

Plotting the result:
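The plot follows the same grid recipe as the linear case (a sketch, reusing make.grid from above on the polynomial fit):

```r
## Color the grid by the polynomial SVM's predictions, then overlay the data
xgrid <- make.grid(x)
ygrid <- predict(svmfit, xgrid)
plot(xgrid, col = c("red", "blue")[as.numeric(ygrid)], pch = 20, cex = .2)
points(x, col = y + 3, pch = 19)
points(x[svmfit$index,], pch = 5, cex = 2)   # support vectors
```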

Example 2 (radial basis kernel)

Load data

Here is a more complex example from Elements of Statistical Learning (from the mda R package), where the decision boundary needs to be non-linear and there is no clear separation.

rm(x,y)
data(ESL.mixture)
attach(ESL.mixture)
names(ESL.mixture)
## [1] "x"        "y"        "xnew"     "prob"     "marginal" "px1"      "px2"     
## [8] "means"

Plotting the data:

plot(x, col = y + 1)

Apply SVM (radial basis kernel)

Now make a data frame with the response \(y\), and turn that into a factor. We will fit an SVM with a radial kernel.

dat = data.frame(y = factor(y), x)
fit = svm(factor(y) ~ ., data = dat, scale = FALSE, 
          kernel = "radial", cost = 5)
print(fit)
## 
## Call:
## svm(formula = factor(y) ~ ., data = dat, kernel = "radial", cost = 5, 
##     scale = FALSE)
## 
## 
## Parameters:
##    SVM-Type:  C-classification 
##  SVM-Kernel:  radial 
##        cost:  5 
## 
## Number of Support Vectors:  103

Plotting the result

It’s time to create a grid and predictions. We use expand.grid with the supplied grid coordinates px1 and px2, predict at each point of the grid, and plot:

xgrid = expand.grid(X1 = px1, X2 = px2)
ygrid = predict(fit, xgrid)
plot(xgrid, col = as.numeric(ygrid), pch = 20, cex = .2)
points(x, col = y + 1, pch = 19)

Plotting with a contour:
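The contour overlay can be drawn from the SVM's decision values, which cross 0 exactly at the decision boundary (a sketch; the blue curve uses the true probabilities supplied with ESL.mixture, whose 0.5 contour is the Bayes decision boundary):

```r
## Extract decision values on the grid and draw the level-0 contour
func <- predict(fit, xgrid, decision.values = TRUE)
func <- attributes(func)$decision.values
plot(xgrid, col = as.numeric(ygrid), pch = 20, cex = .2)
points(x, col = y + 1, pch = 19)
contour(px1, px2, matrix(func, length(px1), length(px2)), level = 0, add = TRUE)
contour(px1, px2, matrix(prob, length(px1), length(px2)), level = 0.5,
        add = TRUE, col = "blue", lwd = 2)   # Bayes decision boundary
```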

Decision Trees

Olive oil

We will use a new dataset that includes the breakdown of the composition of olive oil into 8 fatty acids:

olive <- readRDS("olive.rds")
olive <- select(olive, -area) #remove the `area` column--don't use it
olive %>% datatable()

We will try to predict the region using the fatty acid composition values as predictors.

table(olive$region)
## 
## Northern Italy       Sardinia Southern Italy 
##            151             98            323

Boxplots

A bit of data exploration reveals that we should be able to predict region quite well: note that eicosenoic is only present in Southern Italy and that linoleic separates Northern Italy from Sardinia.

olive %>% gather(fatty_acid, percentage, -region) %>%
  ggplot(aes(region, percentage, fill = region)) +
  geom_boxplot() +
  facet_wrap(~fatty_acid, scales = "free", ncol = 4) +
  theme(axis.text.x = element_blank(), legend.position="bottom")

Partitions

This implies that we should be able to build an algorithm that predicts perfectly! Let’s try plotting the values for eicosenoic and linoleic.

olive %>% 
  ggplot(aes(eicosenoic, linoleic, color = region)) + 
  geom_point(size=2) +
  geom_vline(xintercept = 0.065, lty = 2) + 
  geom_segment(x = -0.2, y = 10.54, xend = 0.065, yend = 10.54, 
               color = "black", lty = 2)

Decision tree

Let’s define a decision rule: if eicosenoic is larger than 0.065, predict Southern Italy; if not, then if linoleic is larger than \(10.535\), predict Sardinia, and otherwise predict Northern Italy. Fitting a classification tree with rpart recovers essentially this rule, which we can draw:

train_rpart <- train(region ~ ., method = "rpart", data = olive)
plot(train_rpart$finalModel, margin = 0.1)
text(train_rpart$finalModel, cex = 0.75)
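To check that the fitted tree reproduces the rule above, we can tabulate its predictions against the true regions (a quick sketch; resubstitution accuracy is optimistic in general, but here the classes are separable):

```r
## Resubstitution confusion matrix for the fitted tree
confusionMatrix(predict(train_rpart, olive), olive$region)
```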

Random Forests

Summarize data

For this example we will use the iris dataset built into R:

datatable(iris)
iris %>% ggplot(aes(Petal.Width, Petal.Length, color = Species)) +
  geom_point()

Select train/test

set.seed(0)
test_index <- createDataPartition(iris$Species, times = 1, 
                                  p = 0.3, list = FALSE)
iris_test <- iris[test_index, ]
iris_train <- iris[-test_index, ]

Apply Random Forest

iris_classifier <- randomForest(Species~., data = iris_train, 
                    importance=T)
iris_classifier
## 
## Call:
##  randomForest(formula = Species ~ ., data = iris_train, importance = T) 
##                Type of random forest: classification
##                      Number of trees: 500
## No. of variables tried at each split: 2
## 
##         OOB estimate of  error rate: 2.86%
## Confusion matrix:
##            setosa versicolor virginica class.error
## setosa         35          0         0  0.00000000
## versicolor      0         34         1  0.02857143
## virginica       0          2        33  0.05714286

Results summary

plot(iris_classifier)

importance(iris_classifier)
##                 setosa versicolor virginica MeanDecreaseAccuracy
## Sepal.Length  6.247640   8.383367  5.865249            10.618792
## Sepal.Width   4.490071   3.098781  3.295522             5.011737
## Petal.Length 22.641994  36.724624 25.382407            34.376688
## Petal.Width  21.561037  31.744843 24.616309            30.800022
##              MeanDecreaseGini
## Sepal.Length         5.527267
## Sepal.Width          1.786515
## Petal.Length        31.838258
## Petal.Width         30.130208
varImpPlot(iris_classifier)
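The randomForest package also provides tuneRF to search over mtry (the number of variables tried at each split) using out-of-bag error; a sketch:

```r
## Tune mtry against out-of-bag error (x = predictors, y = response)
set.seed(0)
tuneRF(iris_train[, -5], iris_train$Species,
       ntreeTry = 500, stepFactor = 2, improve = 0.05, trace = FALSE)
```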

Predictions

predicted_table <- predict(iris_classifier, iris_test[,-5])
confusionMatrix(predicted_table, iris_test[,5])
## Confusion Matrix and Statistics
## 
##             Reference
## Prediction   setosa versicolor virginica
##   setosa         15          0         0
##   versicolor      0         13         1
##   virginica       0          2        14
## 
## Overall Statistics
##                                          
##                Accuracy : 0.9333         
##                  95% CI : (0.8173, 0.986)
##     No Information Rate : 0.3333         
##     P-Value [Acc > NIR] : < 2.2e-16      
##                                          
##                   Kappa : 0.9            
##                                          
##  Mcnemar's Test P-Value : NA             
## 
## Statistics by Class:
## 
##                      Class: setosa Class: versicolor Class: virginica
## Sensitivity                 1.0000            0.8667           0.9333
## Specificity                 1.0000            0.9667           0.9333
## Pos Pred Value              1.0000            0.9286           0.8750
## Neg Pred Value              1.0000            0.9355           0.9655
## Prevalence                  0.3333            0.3333           0.3333
## Detection Rate              0.3333            0.2889           0.3111
## Detection Prevalence        0.3333            0.3111           0.3556
## Balanced Accuracy           1.0000            0.9167           0.9333

Neural Networks

Example 1: Job Placement

Placement data (toy example)

Suppose we have students’ technical knowledge scores (TKS), communication skill scores (CSS), and placement status (Placed):

TKS=c(20,10,30,20,80,30)
CSS=c(90,20,40,50,50,80)
Placed=c(1,0,0,0,1,1)
df=data.frame(TKS,CSS,Placed)
datatable(df)

Fit Neural Network

Fit the multilayer perceptron neural network:

set.seed(0)
nn=neuralnet::neuralnet(Placed~TKS+CSS,data=df, hidden=3,
             act.fct = "logistic", linear.output = FALSE)
names(nn)
##  [1] "call"                "response"            "covariate"          
##  [4] "model.list"          "err.fct"             "act.fct"            
##  [7] "linear.output"       "data"                "exclude"            
## [10] "net.result"          "weights"             "generalized.weights"
## [13] "startweights"        "result.matrix"

Plot Neural Network

We can plot our neural network:

plot(nn, rep="best")

Predictions

Creating a test set:

TKS=c(30,40,85)
CSS=c(85,50,40)
test=data.frame(TKS,CSS)
datatable(test)
pred_nn = neuralnet::compute(nn,test)$net.result
pred_class <- ifelse(pred_nn>0.45, 1, 0)  # 0.45 is an arbitrary classification threshold
pred_class
##      [,1]
## [1,]    0
## [2,]    0
## [3,]    1

Example 2: Iris data

NN on Iris data

Now, using the iris dataset:

set.seed(0)
nn_iris <- neuralnet(Species ~ ., data=iris, hidden=3)
nn_iris
## $call
## neuralnet(formula = Species ~ ., data = iris, hidden = 3)
## 
## $response
##     setosa versicolor virginica
## 1     TRUE      FALSE     FALSE
## 2     TRUE      FALSE     FALSE
## 3     TRUE      FALSE     FALSE
## 4     TRUE      FALSE     FALSE
## 5     TRUE      FALSE     FALSE
## 6     TRUE      FALSE     FALSE
## 7     TRUE      FALSE     FALSE
## 8     TRUE      FALSE     FALSE
## 9     TRUE      FALSE     FALSE
## 10    TRUE      FALSE     FALSE
## 11    TRUE      FALSE     FALSE
## 12    TRUE      FALSE     FALSE
## 13    TRUE      FALSE     FALSE
## 14    TRUE      FALSE     FALSE
## 15    TRUE      FALSE     FALSE
## 16    TRUE      FALSE     FALSE
## 17    TRUE      FALSE     FALSE
## 18    TRUE      FALSE     FALSE
## 19    TRUE      FALSE     FALSE
## 20    TRUE      FALSE     FALSE
## 21    TRUE      FALSE     FALSE
## 22    TRUE      FALSE     FALSE
## 23    TRUE      FALSE     FALSE
## 24    TRUE      FALSE     FALSE
## 25    TRUE      FALSE     FALSE
## 26    TRUE      FALSE     FALSE
## 27    TRUE      FALSE     FALSE
## 28    TRUE      FALSE     FALSE
## 29    TRUE      FALSE     FALSE
## 30    TRUE      FALSE     FALSE
## 31    TRUE      FALSE     FALSE
## 32    TRUE      FALSE     FALSE
## 33    TRUE      FALSE     FALSE
## 34    TRUE      FALSE     FALSE
## 35    TRUE      FALSE     FALSE
## 36    TRUE      FALSE     FALSE
## 37    TRUE      FALSE     FALSE
## 38    TRUE      FALSE     FALSE
## 39    TRUE      FALSE     FALSE
## 40    TRUE      FALSE     FALSE
## 41    TRUE      FALSE     FALSE
## 42    TRUE      FALSE     FALSE
## 43    TRUE      FALSE     FALSE
## 44    TRUE      FALSE     FALSE
## 45    TRUE      FALSE     FALSE
## 46    TRUE      FALSE     FALSE
## 47    TRUE      FALSE     FALSE
## 48    TRUE      FALSE     FALSE
## 49    TRUE      FALSE     FALSE
## 50    TRUE      FALSE     FALSE
## 51   FALSE       TRUE     FALSE
## 52   FALSE       TRUE     FALSE
## 53   FALSE       TRUE     FALSE
## 54   FALSE       TRUE     FALSE
## 55   FALSE       TRUE     FALSE
## 56   FALSE       TRUE     FALSE
## 57   FALSE       TRUE     FALSE
## 58   FALSE       TRUE     FALSE
## 59   FALSE       TRUE     FALSE
## 60   FALSE       TRUE     FALSE
## 61   FALSE       TRUE     FALSE
## 62   FALSE       TRUE     FALSE
## 63   FALSE       TRUE     FALSE
## 64   FALSE       TRUE     FALSE
## 65   FALSE       TRUE     FALSE
## 66   FALSE       TRUE     FALSE
## 67   FALSE       TRUE     FALSE
## 68   FALSE       TRUE     FALSE
## 69   FALSE       TRUE     FALSE
## 70   FALSE       TRUE     FALSE
## 71   FALSE       TRUE     FALSE
## 72   FALSE       TRUE     FALSE
## 73   FALSE       TRUE     FALSE
## 74   FALSE       TRUE     FALSE
## 75   FALSE       TRUE     FALSE
## 76   FALSE       TRUE     FALSE
## 77   FALSE       TRUE     FALSE
## 78   FALSE       TRUE     FALSE
## 79   FALSE       TRUE     FALSE
## 80   FALSE       TRUE     FALSE
## 81   FALSE       TRUE     FALSE
## 82   FALSE       TRUE     FALSE
## 83   FALSE       TRUE     FALSE
## 84   FALSE       TRUE     FALSE
## 85   FALSE       TRUE     FALSE
## 86   FALSE       TRUE     FALSE
## 87   FALSE       TRUE     FALSE
## 88   FALSE       TRUE     FALSE
## 89   FALSE       TRUE     FALSE
## 90   FALSE       TRUE     FALSE
## 91   FALSE       TRUE     FALSE
## 92   FALSE       TRUE     FALSE
## 93   FALSE       TRUE     FALSE
## 94   FALSE       TRUE     FALSE
## 95   FALSE       TRUE     FALSE
## 96   FALSE       TRUE     FALSE
## 97   FALSE       TRUE     FALSE
## 98   FALSE       TRUE     FALSE
## 99   FALSE       TRUE     FALSE
## 100  FALSE       TRUE     FALSE
## 101  FALSE      FALSE      TRUE
## 102  FALSE      FALSE      TRUE
## 103  FALSE      FALSE      TRUE
## 104  FALSE      FALSE      TRUE
## 105  FALSE      FALSE      TRUE
## 106  FALSE      FALSE      TRUE
## 107  FALSE      FALSE      TRUE
## 108  FALSE      FALSE      TRUE
## 109  FALSE      FALSE      TRUE
## 110  FALSE      FALSE      TRUE
## 111  FALSE      FALSE      TRUE
## 112  FALSE      FALSE      TRUE
## 113  FALSE      FALSE      TRUE
## 114  FALSE      FALSE      TRUE
## 115  FALSE      FALSE      TRUE
## 116  FALSE      FALSE      TRUE
## 117  FALSE      FALSE      TRUE
## 118  FALSE      FALSE      TRUE
## 119  FALSE      FALSE      TRUE
## 120  FALSE      FALSE      TRUE
## 121  FALSE      FALSE      TRUE
## 122  FALSE      FALSE      TRUE
## 123  FALSE      FALSE      TRUE
## 124  FALSE      FALSE      TRUE
## 125  FALSE      FALSE      TRUE
## 126  FALSE      FALSE      TRUE
## 127  FALSE      FALSE      TRUE
## 128  FALSE      FALSE      TRUE
## 129  FALSE      FALSE      TRUE
## 130  FALSE      FALSE      TRUE
## 131  FALSE      FALSE      TRUE
## 132  FALSE      FALSE      TRUE
## 133  FALSE      FALSE      TRUE
## 134  FALSE      FALSE      TRUE
## 135  FALSE      FALSE      TRUE
## 136  FALSE      FALSE      TRUE
## 137  FALSE      FALSE      TRUE
## 138  FALSE      FALSE      TRUE
## 139  FALSE      FALSE      TRUE
## 140  FALSE      FALSE      TRUE
## 141  FALSE      FALSE      TRUE
## 142  FALSE      FALSE      TRUE
## 143  FALSE      FALSE      TRUE
## 144  FALSE      FALSE      TRUE
## 145  FALSE      FALSE      TRUE
## 146  FALSE      FALSE      TRUE
## 147  FALSE      FALSE      TRUE
## 148  FALSE      FALSE      TRUE
## 149  FALSE      FALSE      TRUE
## 150  FALSE      FALSE      TRUE
## 
## $covariate
##        Sepal.Length Sepal.Width Petal.Length Petal.Width
##   [1,]          5.1         3.5          1.4         0.2
##   [2,]          4.9         3.0          1.4         0.2
##   [3,]          4.7         3.2          1.3         0.2
##   [4,]          4.6         3.1          1.5         0.2
##   [5,]          5.0         3.6          1.4         0.2
##   ... (144 rows omitted)
## [150,]          5.9         3.0          5.1         1.8
## 
## $model.list
## $model.list$response
## [1] "setosa"     "versicolor" "virginica" 
## 
## $model.list$variables
## [1] "Sepal.Length" "Sepal.Width"  "Petal.Length" "Petal.Width" 
## 
## 
## $err.fct
## function (x, y) 
## {
##     1/2 * (y - x)^2
## }
## <bytecode: 0x12b7c9af0>
## <environment: 0x11c810630>
## attr(,"type")
## [1] "sse"
## 
## $act.fct
## function (x) 
## {
##     1/(1 + exp(-x))
## }
## <bytecode: 0x12b7ce490>
## <environment: 0x11c814070>
## attr(,"type")
## [1] "logistic"
## 
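The `$err.fct` and `$act.fct` entries above show the loss and activation the network was trained with: the sum-of-squared-errors loss and the logistic (sigmoid) activation. These are stored as ordinary R functions, so they can be written out and evaluated directly; a quick self-contained check:

```r
# The logistic activation and SSE error functions printed above,
# written out explicitly and evaluated at a couple of points.
act <- function(x) 1 / (1 + exp(-x))
err <- function(x, y) 1/2 * (y - x)^2

act(0)       # 0.5: the sigmoid is centered at zero
err(0.9, 1)  # 0.005: a small penalty for a near-correct output
```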
## $linear.output
## [1] TRUE
## 
## $data
##     Sepal.Length Sepal.Width Petal.Length Petal.Width    Species
## 1            5.1         3.5          1.4         0.2     setosa
## 2            4.9         3.0          1.4         0.2     setosa
##   ... (147 rows omitted)
## 150          5.9         3.0          5.1         1.8  virginica
## 
## $exclude
## NULL
## 
## $net.result
## $net.result[[1]]
##                 [,1]          [,2]          [,3]
##   [1,]  1.000000e+00 -5.841901e-05  0.0001129900
##   [2,]  1.000000e+00 -5.839920e-05  0.0001129693
##   ... (147 rows omitted)
## [150,] -5.314676e-08  3.570322e-02  0.9682973178
## 
## 
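Each row of `$net.result[[1]]` holds the network's three output values for one flower, in the order given by `$model.list$response` (setosa, versicolor, virginica); the predicted species for a row is the column with the largest value. A minimal self-contained sketch of this step, using a small three-row stand-in matrix rather than the full 150-row result:

```r
# Predicted class = column with the largest output value in each row.
# `net_result` is a small stand-in for the real $net.result[[1]] matrix.
net_result <- rbind(c(1.000, -0.0001,  0.0001),  # output pattern of a setosa row
                    c(0.000,  1.0006, -0.0003),  # output pattern of a versicolor row
                    c(0.000,  0.0161,  0.9817))  # output pattern of a virginica row
species_levels <- c("setosa", "versicolor", "virginica")
pred_class <- species_levels[apply(net_result, 1, which.max)]
pred_class
## [1] "setosa"     "versicolor" "virginica"
```

On the real fitted object, the same `apply(..., 1, which.max)` call yields 150 labels that can be tabulated against `iris$Species` to build a confusion matrix.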
## $weights
## $weights[[1]]
## $weights[[1]][[1]]
##             [,1]       [,2]      [,3]
## [1,] -12.0243897 -23.856522 17.352575
## [2,]  -0.8369312  -0.165147  1.295529
## [3,]  -2.1596604 -19.994336  3.087727
## [4,]   2.5346690  16.474487 -3.954193
## [5,]   6.4882488  56.361775 -8.793004
## 
## $weights[[1]][[2]]
##               [,1]       [,2]        [,3]
## [1,]  9.999992e-01 -2.3969066  2.45818932
## [2,]  8.575603e-07  1.4343652 -1.49921218
## [3,] -1.000000e+00  0.9798511  0.02147428
## [4,]  8.106730e-07  2.3968482 -2.45807632
## 
## 
## 
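The fitted weights come in two matrices whose shapes mirror the architecture: `weights[[1]][[1]]` is 5 x 3 (the four input variables plus a bias row, feeding three hidden units), and `weights[[1]][[2]]` is 4 x 3 (the three hidden units plus a bias row, feeding the three output nodes). In general, a layer taking `n_in` incoming values into `n_out` units is stored as an `(n_in + 1) x n_out` matrix; a small sketch of that rule:

```r
# neuralnet stores one (n_in + 1) x n_out weight matrix per layer;
# the extra row holds the bias (intercept) weights.
layer_dims <- function(n_in, n_out) c(n_in + 1, n_out)

layer_dims(4, 3)  # inputs -> hidden layer: 5 3
layer_dims(3, 3)  # hidden -> output layer: 4 3
```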
## $generalized.weights
## $generalized.weights[[1]]
##                 [,1]          [,2]          [,3]          [,4]          [,5]
##   [1,] -6.493546e-06 -1.675667e-05  1.966583e-05  5.034268e-05  1.144482e-04
##   [2,] -2.259696e-05 -5.831276e-05  6.843524e-05  1.751930e-04  3.983271e-04
##   [3,] -1.346245e-05 -3.474034e-05  4.077128e-05  1.043722e-04  2.372864e-04
##   [4,] -3.015331e-05 -7.781299e-05  9.131965e-05  2.337802e-04  5.315610e-04
##   [5,] -5.689083e-06 -1.468073e-05  1.722950e-05  4.410582e-05  1.002691e-04
##   ... (90 rows omitted)
##  [96,]  3.767060e-01  9.907493e-01 -1.138623e+00 -3.011863e+00 -5.268300e-01
##  [97,]  1.964252e+00  5.220967e+00 -5.930509e+00 -1.597369e+01 -4.919805e+00
##  [98,]  1.268504e+00  3.360103e+00 -3.831283e+00 -1.025905e+01 -2.351284e+00
##  [99,]  3.820781e-02  9.907535e-02 -1.156555e-01 -2.985637e-01 -5.522937e-02
## [100,]  1.802939e+00  4.782890e+00 -5.444586e+00 -1.461627e+01 -4.452110e+00
## [101,] -2.116376e-02 -5.494076e-02  6.405558e-02  1.656804e-01 -5.972750e-02
## [102,]  1.761897e+00  4.931147e+00 -5.289782e+00 -1.554291e+01  2.034278e-01
## [103,] -4.786524e-01 -1.264456e+00  1.446093e+00  3.854300e+00 -1.460170e+01
## [104,]  1.193268e+00  3.321103e+00 -3.584804e+00 -1.043564e+01  2.344093e-01
## [105,] -1.598011e-01 -4.184620e-01  4.832294e-01  1.268737e+00 -5.728014e-01
## [106,] -1.009931e-01 -2.630253e-01  3.055701e-01  7.947821e-01 -3.313147e-01
## [107,] -6.035911e-01 -8.096811e-01  1.917758e+00  1.016530e+00  2.750184e+00
## [108,] -6.691676e+00 -1.777458e+01  2.020507e+01  5.436001e+01  1.089438e+00
## [109,] -3.021939e+00 -8.045902e+00  9.122265e+00  2.464167e+01  1.203282e+00
## [110,] -6.605055e-02 -1.723332e-01  1.998087e-01  5.213231e-01 -1.943746e-01
## [111,]  8.328979e-02  7.890875e-01 -1.833287e-01 -3.457764e+00  5.508669e+00
## [112,]  1.773302e+00  4.956009e+00 -5.324871e+00 -1.560896e+01  2.230998e-01
## [113,] -1.937994e+00 -5.228589e+00  5.841939e+00  1.613931e+01  1.232753e+00
## [114,] -1.514565e+00 -4.077457e+00  4.566592e+00  1.257022e+01  1.434386e+00
## [115,] -8.576574e-02 -2.246741e-01  2.593406e-01  6.813477e-01 -2.511744e-01
## [116,] -4.700220e-01 -1.253745e+00  1.418568e+00  3.844009e+00 -6.283627e+00
## [117,]  5.160558e-01  1.632849e+00 -1.526737e+00 -5.475827e+00 -1.486613e+00
## [118,] -2.802001e-01 -7.340257e-01  8.472754e-01  2.226022e+00 -1.558677e+00
## [119,] -5.781285e-03 -1.495599e-02  1.750426e-02  4.500345e-02 -1.638763e-02
## [120,] -6.285320e-01 -9.735539e-01  1.981347e+00  1.697271e+00  2.746653e+00
## [121,] -2.344063e-01 -6.175917e-01  7.083793e-01  1.879505e+00 -9.707461e-01
## [122,]  2.344331e+00  6.611569e+00 -7.032396e+00 -2.092743e+01  1.035943e-01
## [123,] -1.071644e-01 -2.789563e-01  3.242595e-01  8.426546e-01 -3.596124e-01
## [124,] -7.176253e-01 -1.001011e+00  2.275468e+00  1.396456e+00  2.613337e+00
## [125,] -2.666404e+00 -7.171209e+00  8.040388e+00  2.209477e+01  1.134283e+00
## [126,]  7.752206e-01  2.229544e+00 -2.320275e+00 -7.132023e+00 -1.517838e-01
## [127,] -2.765696e+00 -5.134306e+00  8.616338e+00  1.163354e+01  2.053734e+00
## [128,] -2.625166e+00 -4.910758e+00  8.174045e+00  1.122528e+01  2.047942e+00
## [129,] -3.360035e-01 -8.854411e-01  1.015387e+00  2.694962e+00 -2.197313e+00
## [130,] -4.565594e-01 -6.525155e-01  1.445792e+00  9.651521e-01  3.208272e+00
## [131,] -1.617995e+00 -4.296361e+00  4.885595e+00  1.313700e+01  1.520784e+00
## [132,]  1.259973e+00  3.491367e+00 -3.787046e+00 -1.094363e+01  2.884502e-01
## [133,] -1.563136e-01 -4.097157e-01  4.726370e-01  1.242940e+00 -5.469294e-01
## [134,]  3.362096e+00  6.366990e+00 -1.045932e+01 -1.475694e+01  1.960930e+00
## [135,] -2.348503e-01 -2.162342e-01  7.580366e-01 -8.838771e-02  5.470159e+00
## [136,] -1.024027e-01 -2.676804e-01  3.097171e-01  8.106939e-01 -3.214675e-01
## [137,] -1.396990e-01 -3.665149e-01  4.223586e-01  1.112532e+00 -4.599301e-01
## [138,]  4.109080e-01  1.375906e+00 -1.206567e+00 -4.731136e+00 -2.438318e+00
## [139,] -7.840704e+00 -1.515356e+01  2.435542e+01  3.590917e+01  1.944138e+00
## [140,]  4.883647e+00  1.359595e+01 -1.467096e+01 -4.272811e+01  3.470244e-01
## [141,] -9.987351e-02 -2.615495e-01  3.020099e-01  7.930240e-01 -3.039947e-01
## [142,] -1.364443e+00 -3.764743e+00  4.102982e+00  1.177212e+01  1.148684e+00
## [143,]  1.761897e+00  4.931147e+00 -5.289782e+00 -1.554291e+01  2.034278e-01
## [144,] -1.250008e-01 -3.270494e-01  3.780294e-01  9.910546e-01 -4.111842e-01
## [145,] -6.183972e-02 -1.614729e-01  1.870554e-01  4.887061e-01 -1.784834e-01
## [146,] -4.977980e-01 -1.334354e+00  1.501616e+00  4.103098e+00 -7.379966e+00
## [147,]  1.171441e+00  3.435042e+00 -3.498266e+00 -1.110031e+01 -5.641637e-01
## [148,]  1.007638e+00  3.007999e+00 -3.002708e+00 -9.809102e+00 -9.027441e-01
## [149,] -4.920876e-01 -1.309803e+00  1.485500e+00  4.010756e+00 -1.096453e+01
## [150,] -7.171160e-02  3.115763e-01  2.767904e-01 -1.876372e+00  5.179857e+00
##                 [,6]          [,7]          [,8]          [,9]         [,10]
##   [1,]  2.953403e-04 -3.466081e-04 -8.873116e-04  6.185933e-05  1.596315e-04
##   [2,]  1.027938e-03 -1.206336e-03 -3.088367e-03  2.152641e-04  5.555182e-04
##   [3,]  6.123410e-04 -7.186240e-04 -1.839717e-03  1.282453e-04  3.309493e-04
##   [4,]  1.371787e-03 -1.609833e-03 -4.121476e-03  2.872465e-04  7.412893e-04
##   [5,]  2.587498e-04 -3.036665e-04 -7.773792e-04  5.419591e-05  1.398554e-04
##   [6,]  7.584060e-04 -8.900378e-04 -2.278559e-03  1.588310e-04  4.098800e-04
##   [7,]  1.065682e-03 -1.250631e-03 -3.201766e-03  2.231665e-04  5.759112e-04
##   [8,]  5.134705e-04 -6.025942e-04 -1.542668e-03  1.075408e-04  2.775189e-04
##   [9,]  1.938812e-03 -2.275220e-03 -5.825122e-03  4.059261e-04  1.047576e-03
##  [10,]  5.577518e-04 -6.545575e-04 -1.675713e-03  1.168135e-04  3.014494e-04
##  [11,]  1.922159e-04 -2.255841e-04 -5.774852e-04  4.026091e-05  1.038949e-04
##  [12,]  7.821096e-04 -9.178466e-04 -2.349788e-03  1.637932e-04  4.226893e-04
##  [13,]  5.841285e-04 -6.855124e-04 -1.754959e-03  1.223371e-04  3.157035e-04
##  [14,]  4.149830e-04 -4.870150e-04 -1.246769e-03  8.691587e-05  2.242931e-04
##  [15,]  3.364020e-05 -3.948094e-05 -1.010659e-04  7.046511e-06  1.818344e-05
##  [16,]  1.207245e-04 -1.416835e-04 -3.626973e-04  2.528712e-05  6.525380e-05
##  [17,]  2.751855e-04 -3.229571e-04 -8.267552e-04  5.763844e-05  1.487382e-04
##  [18,]  5.650690e-04 -6.631507e-04 -1.697688e-03  1.183465e-04  3.054032e-04
##  [19,]  3.827122e-04 -4.491444e-04 -1.149812e-03  8.015769e-05  2.068524e-04
##  [20,]  3.808906e-04 -4.470068e-04 -1.144339e-03  7.977622e-05  2.058679e-04
##  [21,]  6.099203e-04 -7.157803e-04 -1.832448e-03  1.277380e-04  3.296415e-04
##  [22,]  9.044569e-04 -1.061435e-03 -2.717360e-03  1.894118e-04  4.887982e-04
##  [23,]  1.312181e-04 -1.539985e-04 -3.942240e-04  2.748502e-05  7.092564e-05
##  [24,]  6.826784e-03 -8.010850e-03 -2.051163e-02  1.427770e-03  3.684847e-03
##  [25,]  1.672980e-03 -1.963249e-03 -5.026455e-03  3.502874e-04  9.039941e-04
##  [26,]  1.569585e-03 -1.841944e-03 -4.715762e-03  3.286494e-04  8.481405e-04
##  [27,]  2.422710e-03 -2.843080e-03 -7.278989e-03  5.071868e-04  1.308900e-03
##  [28,]  3.499803e-04 -4.107310e-04 -1.051473e-03  7.330264e-05  1.891624e-04
##  [29,]  3.371057e-04 -3.956225e-04 -1.012791e-03  7.060639e-05  1.822042e-04
##  [30,]  1.309821e-03 -1.537112e-03 -3.935300e-03  2.742745e-04  7.078129e-04
##  [31,]  1.495107e-03 -1.754543e-03 -4.491997e-03  3.130599e-04  8.079087e-04
##  [32,]  1.345125e-03 -1.578569e-03 -4.041331e-03  2.816682e-04  7.268830e-04
##  [33,]  5.006982e-05 -5.876271e-05 -1.504263e-04  1.048789e-05  2.706404e-05
##  [34,]  4.660755e-05 -5.469951e-05 -1.400243e-04  9.762689e-06  2.519259e-05
##  [35,]  1.067191e-03 -1.252395e-03 -3.206306e-03  2.234817e-04  5.767268e-04
##  [36,]  3.697398e-04 -4.339216e-04 -1.110836e-03  7.744105e-05  1.998414e-04
##  [37,]  1.640166e-04 -1.924905e-04 -4.927630e-04  3.435471e-05  8.865328e-05
##  [38,]  1.470480e-04 -1.725755e-04 -4.417841e-04  3.080052e-05  7.948182e-05
##  [39,]  1.212348e-03 -1.422739e-03 -3.642429e-03  2.538708e-04  6.551522e-04
##  [40,]  4.722508e-04 -5.542215e-04 -1.418826e-03  9.890887e-05  2.552427e-04
##  [41,]  4.768449e-04 -5.596160e-04 -1.432624e-03  9.987134e-05  2.577252e-04
##  [42,]  9.700032e-03 -1.138208e-02 -2.914507e-02  2.027404e-03  5.232555e-03
##  [43,]  7.870939e-04 -9.236998e-04 -2.364757e-03  1.648374e-04  4.253822e-04
##  [44,]  7.157910e-03 -8.399479e-03 -2.150642e-02  1.496926e-03  3.863298e-03
##  [45,]  2.008781e-03 -2.357327e-03 -6.035349e-03  4.205688e-04  1.085365e-03
##  [46,]  2.138936e-03 -2.510081e-03 -6.426373e-03  4.478082e-04  1.155656e-03
##  [47,]  2.565038e-04 -3.010298e-04 -7.706323e-04  5.372541e-05  1.386415e-04
##  [48,]  8.578421e-04 -1.006723e-03 -2.577320e-03  1.796507e-04  4.636113e-04
##  [49,]  2.089930e-04 -2.452732e-04 -6.278901e-04  4.377477e-05  1.129627e-04
##  [50,]  4.945830e-04 -5.804300e-04 -1.485920e-03  1.035857e-04  2.673116e-04
##  [51,]  5.997612e+01 -6.410465e+01 -1.893856e+02 -3.881821e+01 -1.086487e+02
##  [52,]  3.307024e+00 -3.266732e+00 -1.083468e+01 -1.139592e+00 -3.405433e+00
##  [53,] -3.071802e-01  1.678066e+00 -1.005477e+00  3.930933e-01  7.121993e-02
##  [54,]  2.726094e+00 -2.549760e+00 -9.140869e+00 -9.008788e-01 -2.812628e+00
##  [55,] -1.043559e+00  2.617167e+00  1.097260e+00  6.720272e-01  7.467384e-01
##  [56,]  2.519296e+00 -1.860483e+00 -9.173175e+00 -7.119482e-01 -2.649387e+00
##  [57,] -1.035658e+00  2.615693e+00  1.062111e+00  6.707447e-01  7.379967e-01
##  [58,] -2.327167e-01  2.689465e-01  7.052659e-01  8.723735e-02  2.282107e-01
##  [59,]  2.826964e+01 -2.959577e+01 -9.017378e+01 -1.236130e+01 -3.520816e+01
##  [60,]  3.727869e+00 -3.799369e+00 -1.204236e+01 -1.318056e+00 -3.839035e+00
##  [61,] -9.538304e-01  1.082426e+00  2.919781e+00  3.508759e-01  9.326953e-01
##  [62,]  2.903336e+00 -2.875234e+00 -9.501449e+00 -1.000031e+00 -2.982231e+00
##  [63,] -9.215430e-01  1.042404e+00  2.825894e+00  3.383855e-01  9.020425e-01
##  [64,] -8.569983e-01  2.712223e+00  7.719741e-02  6.670557e-01  5.178348e-01
##  [65,] -7.512234e-01  8.613948e-01  2.286565e+00  2.787499e-01  7.342583e-01
##  [66,] -9.417594e+00  1.037034e+01  2.929216e+01  3.175454e+00  8.664390e+00
##  [67,] -3.401159e+00  6.335444e+00  6.787877e+00  1.650061e+00  2.670010e+00
##  [68,] -4.284448e-01  4.895544e-01  1.306619e+00  1.590934e-01  4.203663e-01
##  [69,]  5.496311e+00 -7.619082e+00 -1.480250e+01 -2.565306e+00 -5.669987e+00
##  [70,] -1.044191e+00  1.181482e+00  3.201485e+00  3.830681e-01  1.020894e+00
##  [71,]  4.310393e+00 -5.852475e+00 -1.178816e+01 -1.933686e+00 -4.360492e+00
##  [72,] -2.326244e+00  2.616822e+00  7.154621e+00  8.406340e-01  2.251729e+00
##  [73,]  4.421484e+00 -6.025204e+00 -1.205993e+01 -1.994401e+00 -4.481682e+00
##  [74,]  1.121493e+01 -1.050010e+01 -3.758929e+01 -3.845395e+00 -1.199611e+01
##  [75,] -4.045803e+00  4.480257e+00  1.254711e+01  1.427676e+00  3.876619e+00
##  [76,]  2.778852e+01 -2.999992e+01 -8.731043e+01 -1.271338e+01 -3.528241e+01
##  [77,]  4.147060e-01  7.046733e-01 -2.989593e+00  1.032580e-01 -5.953712e-01
##  [78,]  4.500294e+00 -6.135308e+00 -1.227092e+01 -2.033417e+00 -4.567413e+00
##  [79,] -9.068096e-01  2.452437e+00  6.926250e-01  6.225325e-01  6.205227e-01
##  [80,] -1.276659e-01  1.481504e-01  3.860099e-01  4.804048e-02  1.252157e-01
##  [81,] -1.122558e+00  1.270336e+00  3.441492e+00  4.115639e-01  1.096699e+00
##  [82,] -3.909021e-01  4.487792e-01  1.189020e+00  1.456641e-01  3.832825e-01
##  [83,] -1.154082e+00  1.308834e+00  3.534005e+00  4.235898e-01  1.126616e+00
##  [84,]  5.191546e+00 -7.126036e+00 -1.408501e+01 -2.379884e+00 -5.310645e+00
##  [85,] -1.448289e+01  2.353450e+01  3.394373e+01  5.261574e+00  9.897112e+00
##  [86,]  1.392684e+00 -9.295743e-01 -5.215765e+00 -3.711044e-01 -1.479002e+00
##  [87,]  1.503727e+00 -9.893132e-01 -5.652679e+00 -3.972864e-01 -1.598421e+00
##  [88,]  1.350651e+00 -6.969757e-01 -5.357709e+00 -3.132927e-01 -1.462073e+00
##  [89,] -3.494277e+00  3.873564e+00  1.083074e+01  1.239408e+00  3.362364e+00
##  [90,]  5.794797e+00 -6.007472e+00 -1.857068e+01 -2.105337e+00 -6.045900e+00
##  [91,]  5.190782e+00 -4.670233e+00 -1.767567e+01 -1.683433e+00 -5.417901e+00
##  [92,]  1.973709e+00 -1.399874e+00 -7.271049e+00 -5.444706e-01 -2.083356e+00
##  [93,] -2.316998e+00  2.589649e+00  7.150731e+00  8.340704e-01  2.246720e+00
##  [94,] -2.672924e-01  3.086309e-01  8.104514e-01  1.001050e-01  2.620774e-01
##  [95,]  6.399826e+00 -6.521987e+00 -2.067460e+01 -2.302029e+00 -6.705504e+00
##  [96,] -1.416684e+00  1.588650e+00  4.364480e+00  5.150341e-01  1.383348e+00
##  [97,] -1.366147e+01  1.478380e+01  4.287235e+01  4.435073e+00  1.228334e+01
##  [98,] -6.455056e+00  7.074396e+00  2.012690e+01  2.222171e+00  6.088134e+00
##  [99,] -1.438118e-01  1.671082e-01  4.345046e-01  5.415607e-02  1.409904e-01
## [100,] -1.228170e+01  1.338812e+01  3.839972e+01  4.037010e+00  1.111126e+01
## [101,] -1.558693e-01  1.806769e-01  4.715812e-01  5.525739e-02  1.441690e-01
## [102,]  1.049700e+00 -5.530992e-01 -4.147190e+00 -2.558215e-01 -1.179695e+00
## [103,] -3.975514e+01  4.397245e+01  1.233670e+02  5.436691e+00  1.477744e+01
## [104,]  1.004446e+00 -6.619553e-01 -3.774180e+00 -2.725640e-01 -1.095402e+00
## [105,] -1.523513e+00  1.729292e+00  4.663089e+00  5.072189e-01  1.348026e+00
## [106,] -8.704028e-01  1.001540e+00  2.644217e+00  2.994340e-01  7.863246e-01
## [107,]  6.062876e+00 -8.453100e+00 -1.625718e+01 -2.849871e+00 -6.263668e+00
## [108,]  3.018254e+00 -3.274545e+00 -9.459669e+00 -1.173828e+00 -3.243960e+00
## [109,]  3.361485e+00 -3.613389e+00 -1.058449e+01 -1.321809e+00 -3.681786e+00
## [110,] -5.126585e-01  5.873385e-01  1.561162e+00  1.780268e-01  4.692965e-01
## [111,]  1.161827e+01 -1.699482e+01 -2.998832e+01 -6.175787e+00 -1.294547e+01
## [112,]  1.087660e+00 -6.142130e-01 -4.236991e+00 -2.745537e-01 -1.215569e+00
## [113,]  3.653092e+00 -3.676770e+00 -1.186869e+01 -1.416108e+00 -4.167475e+00
## [114,]  4.202813e+00 -4.283887e+00 -1.357592e+01 -1.684645e+00 -4.906342e+00
## [115,] -6.686679e-01  7.582250e-01  2.047734e+00  2.292448e-01  6.097930e-01
## [116,] -1.767539e+01  1.885480e+01  5.586789e+01  3.478459e+00  9.752217e+00
## [117,] -2.631697e+00  4.646815e+00  5.625914e+00  1.293073e+00  2.222626e+00
## [118,] -4.149546e+00  4.705198e+00  1.270777e+01  1.264592e+00  3.363895e+00
## [119,] -4.248112e-02  4.960716e-02  1.279923e-01  1.519627e-02  3.938921e-02
## [120,]  6.089551e+00 -8.438112e+00 -1.640507e+01 -2.853145e+00 -6.308603e+00
## [121,] -2.623009e+00  2.925762e+00  8.103787e+00  8.293301e-01  2.237862e+00
## [122,]  9.327051e-01 -2.338720e-01 -4.061954e+00 -1.695432e-01 -1.099858e+00
## [123,] -9.437273e-01  1.087204e+00  2.865081e+00  3.241170e-01  8.502516e-01
## [124,]  5.768737e+00 -8.031575e+00 -1.548518e+01 -2.694818e+00 -5.931091e+00
## [125,]  3.306864e+00 -3.389608e+00 -1.065408e+01 -1.269822e+00 -3.681082e+00
## [126,]  1.325849e-01  5.226080e-01 -1.390947e+00  9.993274e-02 -2.473966e-01
## [127,]  4.611525e+00 -6.302373e+00 -1.255164e+01 -2.086769e+00 -4.676195e+00
## [128,]  4.602099e+00 -6.284170e+00 -1.253377e+01 -2.081177e+00 -4.667470e+00
## [129,] -5.940437e+00  6.622170e+00  1.835873e+01  1.684674e+00  4.548257e+00
## [130,]  7.088324e+00 -9.859233e+00 -1.904137e+01 -3.388279e+00 -7.464382e+00
## [131,]  4.208649e+00 -4.571607e+00 -1.318237e+01 -1.728384e+00 -4.771524e+00
## [132,]  1.121124e+00 -8.283530e-01 -4.081600e+00 -3.251529e-01 -1.209580e+00
## [133,] -1.458035e+00  1.650784e+00  4.468804e+00  4.855990e-01  1.293428e+00
## [134,]  4.412357e+00 -6.016476e+00 -1.202960e+01 -1.991156e+00 -4.471733e+00
## [135,]  1.196562e+01 -1.682457e+01 -3.187767e+01 -6.300089e+00 -1.373292e+01
## [136,] -8.514583e-01  9.709413e-01  2.599545e+00  2.912018e-01  7.707951e-01
## [137,] -1.229000e+00  1.387849e+00  3.772122e+00  4.118211e-01  1.099414e+00
## [138,] -4.629808e+00  7.584023e+00  1.076218e+01  2.086433e+00  3.891282e+00
## [139,]  4.384905e+00 -5.963716e+00 -1.197719e+01 -1.970681e+00 -4.436701e+00
## [140,]  1.497896e+00 -9.786650e-01 -5.640727e+00 -4.167647e-01 -1.684397e+00
## [141,] -8.086748e-01  9.177477e-01  2.475374e+00  2.760332e-01  7.337158e-01
## [142,]  4.158587e+00 -3.335448e+00 -1.475520e+01 -1.520234e+00 -5.333907e+00
## [143,]  1.049700e+00 -5.530992e-01 -4.147190e+00 -2.558215e-01 -1.179695e+00
## [144,] -1.091402e+00  1.241638e+00  3.336377e+00  3.695012e-01  9.800631e-01
## [145,] -4.715690e-01  5.392214e-01  1.437560e+00  1.637727e-01  4.324541e-01
## [146,] -2.118267e+01  2.209368e+01  6.768894e+01  3.740990e+00  1.068870e+01
## [147,] -5.089352e-01  1.822237e+00 -2.637935e-01  4.670217e-01  2.893208e-01
## [148,] -1.209262e+00  2.868446e+00  1.511948e+00  7.757733e-01  9.303637e-01
## [149,] -3.059841e+01  3.292971e+01  9.629074e+01  4.672339e+00  1.300153e+01
## [150,]  1.109973e+01 -1.595940e+01 -2.905520e+01 -5.818392e+00 -1.240703e+01
##                [,11]         [,12]
##   [1,] -1.873419e-04 -4.795919e-04
##   [2,] -6.519285e-04 -1.669013e-03
##   [3,] -3.883921e-04 -9.943034e-04
##   [4,] -8.699263e-04 -2.227169e-03
##   [5,] -1.641331e-04 -4.201766e-04
##   [6,] -4.810210e-04 -1.231445e-03
##   [7,] -6.758610e-04 -1.730282e-03
##   [8,] -3.256886e-04 -8.337760e-04
##   [9,] -1.229346e-03 -3.147417e-03
##  [10,] -3.537707e-04 -9.056758e-04
##  [11,] -1.219309e-04 -3.121372e-04
##  [12,] -4.960488e-04 -1.269936e-03
##  [13,] -3.704989e-04 -9.485006e-04
##  [14,] -2.632258e-04 -6.738624e-04
##  [15,] -2.134052e-05 -5.462882e-05
##  [16,] -7.658256e-05 -1.960445e-04
##  [17,] -1.745590e-04 -4.468624e-04
##  [18,] -3.584138e-04 -9.175497e-04
##  [19,] -2.427586e-04 -6.214625e-04
##  [20,] -2.416034e-04 -6.185046e-04
##  [21,] -3.868557e-04 -9.903762e-04
##  [22,] -5.736351e-04 -1.468549e-03
##  [23,] -8.323890e-05 -2.130848e-04
##  [24,] -4.323976e-03 -1.107140e-02
##  [25,] -1.060844e-03 -2.716040e-03
##  [26,] -9.953146e-04 -2.548206e-03
##  [27,] -1.536014e-03 -3.932562e-03
##  [28,] -2.219980e-04 -5.683151e-04
##  [29,] -2.138325e-04 -5.474092e-04
##  [30,] -8.306404e-04 -2.126591e-03
##  [31,] -9.481016e-04 -2.427331e-03
##  [32,] -8.530336e-04 -2.183866e-03
##  [33,] -3.176278e-05 -8.130929e-05
##  [34,] -2.956652e-05 -7.568673e-05
##  [35,] -6.768153e-04 -1.732737e-03
##  [36,] -2.345313e-04 -6.003978e-04
##  [37,] -1.040439e-04 -2.663452e-04
##  [38,] -9.327991e-05 -2.387914e-04
##  [39,] -7.688486e-04 -1.968364e-03
##  [40,] -2.995466e-04 -7.668484e-04
##  [41,] -3.024616e-04 -7.743043e-04
##  [42,] -6.139937e-03 -1.572189e-02
##  [43,] -4.992112e-04 -1.278024e-03
##  [44,] -4.533416e-03 -1.160751e-02
##  [45,] -1.273691e-03 -3.260959e-03
##  [46,] -1.356186e-03 -3.472132e-03
##  [47,] -1.627082e-04 -4.165302e-04
##  [48,] -5.440732e-04 -1.392883e-03
##  [49,] -1.325726e-04 -3.393805e-04
##  [50,] -3.137104e-04 -8.031078e-04
##  [51,]  1.165441e+02  3.424687e+02
##  [52,]  3.395498e+00  1.111091e+01
##  [53,] -1.303699e+00  1.571768e+00
##  [54,]  2.669767e+00  9.373844e+00
##  [55,] -2.153770e+00 -3.738904e-01
##  [56,]  2.058662e+00  9.497421e+00
##  [57,] -2.150538e+00 -3.373426e-01
##  [58,] -2.638288e-01 -6.914787e-01
##  [59,]  3.703918e+01  1.120435e+02
##  [60,]  3.939211e+00  1.236262e+01
##  [61,] -1.059363e+00 -2.853734e+00
##  [62,]  2.980405e+00  9.720064e+00
##  [63,] -1.021346e+00 -2.764632e+00
##  [64,] -2.164649e+00  7.229174e-01
##  [65,] -8.424063e-01 -2.234246e+00
##  [66,] -9.560501e+00 -2.692078e+01
##  [67,] -5.187848e+00 -5.014982e+00
##  [68,] -4.806383e-01 -1.281522e+00
##  [69,]  7.883088e+00  1.523621e+01
##  [70,] -1.156243e+00 -3.128417e+00
##  [71,]  5.931754e+00  1.190869e+01
##  [72,] -2.535979e+00 -6.921082e+00
##  [73,]  6.119891e+00  1.220560e+01
##  [74,]  1.139703e+01  3.996528e+01
##  [75,] -4.300641e+00 -1.201110e+01
##  [76,]  3.820559e+01  1.106870e+02
##  [77,] -4.161645e-01  3.420441e+00
##  [78,]  6.239847e+00  1.243486e+01
##  [79,] -2.003694e+00  2.452588e-03
##  [80,] -1.453419e-01 -3.785505e-01
##  [81,] -1.242270e+00 -3.360462e+00
##  [82,] -4.402590e-01 -1.165510e+00
##  [83,] -1.278825e+00 -3.448229e+00
##  [84,]  7.307232e+00  1.438220e+01
##  [85,] -1.637655e+01 -2.276588e+01
##  [86,]  1.061318e+00  5.430538e+00
##  [87,]  1.134386e+00  5.887495e+00
##  [88,]  8.703595e-01  5.630085e+00
##  [89,] -3.733881e+00 -1.041229e+01
##  [90,]  6.302472e+00  1.932463e+01
##  [91,]  4.969426e+00  1.831024e+01
##  [92,]  1.567519e+00  7.543442e+00
##  [93,] -2.514669e+00 -6.928616e+00
##  [94,] -3.027193e-01 -7.944784e-01
##  [95,]  6.879906e+00  2.159417e+01
##  [96,] -1.553274e+00 -4.258845e+00
##  [97,] -1.333105e+01 -3.849101e+01
##  [98,] -6.687425e+00 -1.896065e+01
##  [99,] -1.638639e-01 -4.259300e-01
## [100,] -1.214290e+01 -3.469538e+01
## [101,] -1.671589e-01 -4.361169e-01
## [102,]  7.123997e-01  4.527881e+00
## [103,] -1.637535e+01 -4.581260e+01
## [104,]  7.784081e-01  4.033236e+00
## [105,] -1.531425e+00 -4.124033e+00
## [106,] -9.052057e-01 -2.388192e+00
## [107,]  8.761778e+00  1.675354e+01
## [108,]  3.529171e+00  1.015279e+01
## [109,]  3.970614e+00  1.157412e+01
## [110,] -5.379699e-01 -1.428661e+00
## [111,]  1.906253e+01  3.322911e+01
## [112,]  7.706266e-01  4.612059e+00
## [113,]  4.227115e+00  1.349214e+01
## [114,]  5.034874e+00  1.579886e+01
## [115,] -6.920851e-01 -1.866527e+00
## [116,] -1.044144e+01 -3.076819e+01
## [117,] -4.049833e+00 -4.568007e+00
## [118,] -3.817767e+00 -1.029674e+01
## [119,] -4.600120e-02 -1.186699e-01
## [120,]  8.767316e+00  1.695762e+01
## [121,] -2.499908e+00 -6.908393e+00
## [122,]  4.339632e-01  4.558404e+00
## [123,] -9.799307e-01 -2.580690e+00
## [124,]  8.284092e+00  1.588224e+01
## [125,]  3.797156e+00  1.182467e+01
## [126,] -3.632964e-01  1.699923e+00
## [127,]  6.404893e+00  1.270697e+01
## [128,]  6.387270e+00  1.269156e+01
## [129,] -5.077952e+00 -1.404493e+01
## [130,]  1.041500e+01  2.000370e+01
## [131,]  5.197066e+00  1.492487e+01
## [132,]  9.402586e-01  4.335508e+00
## [133,] -1.465805e+00 -3.962257e+00
## [134,]  6.110252e+00  1.217273e+01
## [135,]  1.938297e+01  3.647848e+01
## [136,] -8.795886e-01 -2.352354e+00
## [137,] -1.242803e+00 -3.372501e+00
## [138,] -6.497982e+00 -8.864355e+00
## [139,]  6.046108e+00  1.210118e+01
## [140,]  1.189090e+00  6.213413e+00
## [141,] -8.334024e-01 -2.244863e+00
## [142,]  4.434707e+00  1.869622e+01
## [143,]  7.123997e-01  4.527881e+00
## [144,] -1.115854e+00 -2.994727e+00
## [145,] -4.948081e-01 -1.317862e+00
## [146,] -1.120543e+01 -3.407211e+01
## [147,] -1.524312e+00  8.647783e-01
## [148,] -2.478061e+00 -7.663458e-01
## [149,] -1.403690e+01 -4.084922e+01
## [150,]  1.793408e+01  3.233814e+01
## 
## 
## $startweights
## $startweights[[1]]
## $startweights[[1]][[1]]
##            [,1]         [,2]       [,3]
## [1,]  1.2629543 -1.539950042  0.7635935
## [2,] -0.3262334 -0.928567035 -0.7990092
## [3,]  1.3297993 -0.294720447 -1.1476570
## [4,]  1.2724293 -0.005767173 -0.2894616
## [5,]  0.4146414  2.404653389 -0.2992151
## 
## $startweights[[1]][[2]]
##            [,1]       [,2]        [,3]
## [1,] -0.4115108 -1.2375384  0.80418951
## [2,]  0.2522234 -0.2242679 -0.05710677
## [3,] -0.8919211  0.3773956  0.50360797
## [4,]  0.4356833  0.1333364  1.08576936
## 
## 
## 
## $result.matrix
##                                   [,1]
## error                     1.934716e+00
## reached.threshold         9.909091e-03
## steps                     2.665400e+04
## Intercept.to.1layhid1    -1.202439e+01
## Sepal.Length.to.1layhid1 -8.369312e-01
## Sepal.Width.to.1layhid1  -2.159660e+00
## Petal.Length.to.1layhid1  2.534669e+00
## Petal.Width.to.1layhid1   6.488249e+00
## Intercept.to.1layhid2    -2.385652e+01
## Sepal.Length.to.1layhid2 -1.651470e-01
## Sepal.Width.to.1layhid2  -1.999434e+01
## Petal.Length.to.1layhid2  1.647449e+01
## Petal.Width.to.1layhid2   5.636178e+01
## Intercept.to.1layhid3     1.735257e+01
## Sepal.Length.to.1layhid3  1.295529e+00
## Sepal.Width.to.1layhid3   3.087727e+00
## Petal.Length.to.1layhid3 -3.954193e+00
## Petal.Width.to.1layhid3  -8.793004e+00
## Intercept.to.setosa       9.999992e-01
## 1layhid1.to.setosa        8.575603e-07
## 1layhid2.to.setosa       -1.000000e+00
## 1layhid3.to.setosa        8.106730e-07
## Intercept.to.versicolor  -2.396907e+00
## 1layhid1.to.versicolor    1.434365e+00
## 1layhid2.to.versicolor    9.798511e-01
## 1layhid3.to.versicolor    2.396848e+00
## Intercept.to.virginica    2.458189e+00
## 1layhid1.to.virginica    -1.499212e+00
## 1layhid2.to.virginica     2.147428e-02
## 1layhid3.to.virginica    -2.458076e+00
## 
## attr(,"class")
## [1] "nn"
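
The `result.matrix` printed above bundles the fit diagnostics (error, steps) together with the learned weights in a single one-column matrix with named rows. The summary entries can be pulled out by row name; a quick sketch, assuming `nn_iris` is the fitted `neuralnet` object from the chunk above:

```r
# Training diagnostics live in named rows of result.matrix
nn_iris$result.matrix["error", 1]   # final error at convergence
nn_iris$result.matrix["steps", 1]   # number of training steps taken
```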

Plot for Iris NN

Plotting the Neural Network:

plot(nn_iris, rep="best")

NN on Iris data using caret

Now, doing the same thing using caret:

set.seed(0)
nn_caret <- caret::train(Species~., data = iris, 
                         method = "nnet", linout = TRUE, 
                         trace = FALSE)
ps <- predict(nn_caret, iris)
confusionMatrix(ps, iris$Species)$overall["Accuracy"]
##  Accuracy 
## 0.9733333
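
With `method = "nnet"`, caret tunes the network's `size` (hidden units) and `decay` (weight decay) by resampling. One way to inspect what it settled on, assuming `nn_caret` from the chunk above:

```r
nn_caret$bestTune    # size and decay selected by resampling
nn_caret$results     # accuracy across the full tuning grid
```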

Plot caret NN

Plotting the caret neural network:

NeuralNetTools::plotnet(nn_caret)  

Additional practice

The TBnanostring.rds dataset contains gene expression measurements in the blood for 107 TB-related genes for 179 patients with either active tuberculosis infection (TB) or latent TB infection (LTBI) from one of Dr. Johnson’s publications. When you load these data into R (TBnanostring <- readRDS("TBnanostring.rds")), the TB status is found in the first column of the data frame, followed by the genes in the subsequent columns. The rows represent each individual patient.

Here is a UMAP clustering of the dataset, plotted with ggplot; the points are colored based on TB status.
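
A minimal sketch of how such a figure might be produced, assuming TBnanostring.rds is in the working directory; the outcome is referenced by position, since only its location in column 1 is given:

```r
library(umap)
library(ggplot2)

TBnanostring <- readRDS("TBnanostring.rds")

set.seed(0)
umap_out <- umap(TBnanostring[, -1])   # embed the gene columns in 2D
umap_df  <- data.frame(UMAP1  = umap_out$layout[, 1],
                       UMAP2  = umap_out$layout[, 2],
                       Status = TBnanostring[[1]])
ggplot(umap_df, aes(x = UMAP1, y = UMAP2, color = Status)) +
  geom_point() +
  labs(title = "UMAP of TB NanoString gene expression")
```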

Using the caret::train() function and cross-validation, apply the following machine learning methods to build a predictive biomarker that distinguishes the TB and LTBI samples, and find the “finalModel” parameters for each method (the code is ‘hiding’ in the .Rmd file!). Provide any relevant/informative plots with your results.

  1. Split the dataset into “training” and “testing” sets using a 70/30 partition (use set.seed(0) and the caret::createDataPartition() function).
  2. Apply a Support Vector Machine to these data (try linear, radial, and polynomial kernels).
  3. Apply a Random Forest Model to these data.
  4. Apply a Feedforward Perceptron Neural Network to these data.
  5. Compare the overall accuracy of each of the machine learning methods above. Which one performs best?

(Note: the TBnanostring.Rmd and TBnanostring.html files provide suggested solutions for these analyses)
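
The split-and-compare workflow in steps 1–5 might be sketched as follows; this is a minimal outline, not the suggested solution: it assumes TBnanostring.rds is in the working directory, renames the first (outcome) column to the hypothetical name Status for convenience, and uses caret's default resampling for tuning:

```r
library(caret)

TBnanostring <- readRDS("TBnanostring.rds")
names(TBnanostring)[1] <- "Status"   # outcome is the first column; "Status" is our label
TBnanostring$Status <- factor(TBnanostring$Status)

# Step 1: 70/30 train/test split
set.seed(0)
in_train  <- createDataPartition(TBnanostring$Status, p = 0.7, list = FALSE)
train_set <- TBnanostring[in_train, ]
test_set  <- TBnanostring[-in_train, ]

# Steps 2-4: fit each model; caret's resampling tunes the hyperparameters
methods <- c(svm_linear  = "svmLinear", svm_radial = "svmRadial",
             svm_poly    = "svmPoly",   rand_forest = "rf",
             neural_net  = "nnet")
fits <- lapply(methods, function(m)
  train(Status ~ ., data = train_set, method = m))

# Step 5: held-out accuracy for each method
sapply(fits, function(f)
  confusionMatrix(predict(f, test_set), test_set$Status)$overall["Accuracy"])
```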

Session Info

sessionInfo()
## R version 4.4.0 (2024-04-24)
## Platform: aarch64-apple-darwin20
## Running under: macOS Sonoma 14.2.1
## 
## Matrix products: default
## BLAS:   /Library/Frameworks/R.framework/Versions/4.4-arm64/Resources/lib/libRblas.0.dylib 
## LAPACK: /Library/Frameworks/R.framework/Versions/4.4-arm64/Resources/lib/libRlapack.dylib;  LAPACK version 3.12.0
## 
## locale:
## [1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
## 
## time zone: Africa/Kampala
## tzcode source: internal
## 
## attached base packages:
## [1] stats     graphics  grDevices utils     datasets  methods   base     
## 
## other attached packages:
##  [1] neuralnet_1.44.2     randomForest_4.7-1.2 mda_0.5-4           
##  [4] class_7.3-22         gridExtra_2.3        e1071_1.7-16        
##  [7] lubridate_1.9.3      forcats_1.0.0        stringr_1.5.1       
## [10] dplyr_1.1.4          purrr_1.0.2          readr_2.1.5         
## [13] tidyr_1.3.1          tibble_3.2.1         tidyverse_2.0.0     
## [16] DT_0.33              caret_6.0-94         lattice_0.22-6      
## [19] ggplot2_3.5.1        umap_0.2.10.0       
## 
## loaded via a namespace (and not attached):
##  [1] tidyselect_1.2.1     timeDate_4041.110    farver_2.1.2        
##  [4] fastmap_1.2.0        pROC_1.18.5          digest_0.6.37       
##  [7] rpart_4.1.23         timechange_0.3.0     lifecycle_1.0.4     
## [10] survival_3.7-0       magrittr_2.0.3       compiler_4.4.0      
## [13] rlang_1.1.4          sass_0.4.9           tools_4.4.0         
## [16] utf8_1.2.4           yaml_2.3.10          data.table_1.16.0   
## [19] knitr_1.48           labeling_0.4.3       askpass_1.2.0       
## [22] htmlwidgets_1.6.4    reticulate_1.39.0    plyr_1.8.9          
## [25] withr_3.0.1          nnet_7.3-19          grid_4.4.0          
## [28] stats4_4.4.0         fansi_1.0.6          colorspace_2.1-1    
## [31] future_1.34.0        globals_0.16.3       scales_1.3.0        
## [34] iterators_1.0.14     MASS_7.3-61          cli_3.6.3           
## [37] rmarkdown_2.28       generics_0.1.3       rstudioapi_0.16.0   
## [40] future.apply_1.11.2  RSpectra_0.16-2      tzdb_0.4.0          
## [43] reshape2_1.4.4       proxy_0.4-27         cachem_1.1.0        
## [46] splines_4.4.0        parallel_4.4.0       vctrs_0.6.5         
## [49] hardhat_1.4.0        Matrix_1.7-0         jsonlite_1.8.9      
## [52] hms_1.1.3            listenv_0.9.1        crosstalk_1.2.1     
## [55] foreach_1.5.2        gower_1.0.1          jquerylib_0.1.4     
## [58] recipes_1.1.0        glue_1.8.0           parallelly_1.38.0   
## [61] codetools_0.2-20     stringi_1.8.4        gtable_0.3.5        
## [64] munsell_0.5.1        pillar_1.9.0         htmltools_0.5.8.1   
## [67] ipred_0.9-15         openssl_2.2.2        lava_1.8.0          
## [70] R6_2.5.1             NeuralNetTools_1.5.3 evaluate_1.0.0      
## [73] highr_0.11           png_0.1-8            bslib_0.8.0         
## [76] Rcpp_1.0.13          nlme_3.1-166         prodlim_2024.06.25  
## [79] xfun_0.47            pkgconfig_2.0.3      ModelMetrics_1.2.2.2